
metric log

See also in other dictionaries:

  • Metric space — In mathematics, a metric space is a set where a notion of distance (called a metric) between elements of the set is defined. The metric space which most closely corresponds to our intuitive understanding of space is the 3-dimensional Euclidean… (Wikipedia)
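The truncated definition above can be illustrated with a minimal Python sketch (the helper name `euclidean` and the sample points are illustrative, not from the source): the Euclidean distance on 3-dimensional space satisfies the four metric axioms.

```python
import math

def euclidean(p, q):
    """Euclidean distance, the metric of 3-dimensional Euclidean space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# The metric axioms, spot-checked on a few sample points:
x, y, z = (0.0, 0.0, 0.0), (1.0, 2.0, 2.0), (4.0, 0.0, 3.0)
assert euclidean(x, y) >= 0                                   # non-negativity
assert euclidean(x, x) == 0                                   # identity of indiscernibles
assert euclidean(x, y) == euclidean(y, x)                     # symmetry
assert euclidean(x, z) <= euclidean(x, y) + euclidean(y, z)   # triangle inequality
```

Any function satisfying these four axioms turns a set into a metric space; the Euclidean case is simply the most familiar instance.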

  • Poincaré metric — In mathematics, the Poincaré metric, named after Henri Poincaré, is the metric tensor describing a two-dimensional surface of constant negative curvature. It is the natural metric commonly used in a variety of calculations in hyperbolic geometry… (Wikipedia)

  • Fisher information metric — In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space… (Wikipedia)

  • Bergman metric — In differential geometry, the Bergman metric is a Hermitian metric that can be defined on certain types of complex manifold. It is so called because it is derived from the Bergman kernel. Definition: Let G ⊂ ℂⁿ be a domain and let… (Wikipedia)

  • каротажная диаграмма с метрической шкалой глубины (well log with a metric depth scale) — [http://slovarionline.ru/anglo russkiy slovar neftegazovoy promyishlennosti/] Topics: oil and gas industry. EN: metric log… (Справочник технического переводчика, Technical Translator's Reference)

  • Extremal length — In the mathematical theory of conformal and quasiconformal mappings, the extremal length of a collection of curves Γ is a conformal invariant of Γ. More specifically, suppose that D is an open set in the complex plane and Γ is a… (Wikipedia)

  • Mutual information — [Figure: individual H(X), H(Y), joint H(X,Y), and conditional entropies for a pair of correlated subsystems X, Y with mutual information I(X; Y).] In probability theory and information theory, the mutual information (sometimes known by the archaic term… (Wikipedia)

  • Kullback–Leibler divergence — In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P… (Wikipedia)
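The non-symmetry mentioned in the entry above is easy to see numerically. A minimal sketch for discrete distributions (the function name `kl_divergence` and the sample distributions are illustrative, not from the source):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats.

    Terms with p_i == 0 contribute 0 by the convention 0 * log(0) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # fair coin
q = [0.9, 0.1]   # heavily biased coin

# The divergence is not symmetric: D_KL(P||Q) != D_KL(Q||P) in general.
print(kl_divergence(p, q))   # ≈ 0.511 nats
print(kl_divergence(q, p))   # ≈ 0.368 nats
```

Because of this asymmetry, the Kullback–Leibler divergence is not a metric in the sense of the "Metric space" entry above, despite measuring a kind of distance between distributions.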

  • Web analytics — [Infobox: Internet marketing — display advertising, email marketing, e-mail marketing software, interactive advertising, cloud marketing, social media optimization…] (Wikipedia)

  • Industrial Review — (1994 Introduction) The period since 1990 was proving a difficult time for the older industrialized economies, which had suffered from prolonged recession at home, and also for the previously centrally planned economies of Eastern Europe… (Universalium)

  • Travelling salesman problem — The travelling salesman problem (TSP) is an NP-hard problem in combinatorial optimization studied in operations research and theoretical computer science. Given a list of cities and their pairwise distances, the task is to find a shortest… (Wikipedia)
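The task described in the entry above can be solved exactly by brute force for small instances. A minimal sketch (the function name `tsp_brute_force` and the 4-city distance matrix are illustrative, not from the source); its O(n!) running time is precisely why the problem is considered hard:

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exact TSP: enumerate every tour starting and ending at city 0.

    dist is a symmetric matrix of pairwise distances; returns the
    shortest closed tour and its length. Runs in O(n!) time, so it is
    only feasible for small n.
    """
    n = len(dist)
    best_tour, best_len = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

# A small symmetric 4-city instance (distances chosen for illustration):
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
tour, length = tsp_brute_force(d)
print(tour, length)   # shortest tour for this instance has length 18
```

Practical solvers replace this enumeration with dynamic programming, branch-and-bound, or heuristics, since n! grows far too quickly for real city lists.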
